Recursive Exponential Weighting for Online Non-convex Optimization

Authors

  • Lin Yang
  • Cheng Tan
  • Wing Shing Wong
Abstract

In this paper, we investigate the online non-convex optimization problem, which generalizes the classic online convex optimization problem by relaxing the convexity assumption on the cost function. For this type of problem, the classic exponential weighting online algorithm has recently been shown to attain a sub-linear regret of O(√(T log T)). In this paper, we introduce a novel recursive structure into the online algorithm to define a recursive exponential weighting algorithm that attains a regret of O(√T), matching the well-known regret lower bound. To the best of our knowledge, this is the first online algorithm with provable O(√T) regret for the online non-convex optimization problem.
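As background for the construction described above, the following sketch implements the classic (non-recursive) exponentially weighted forecaster over a finite discretization of the decision set, i.e., the baseline the paper improves upon. The grid size, learning rate, and randomly generated bounded costs are illustrative assumptions, not the paper's recursive scheme.

```python
import numpy as np

def exponential_weighting(costs, eta):
    """Classic exponentially weighted forecaster over K discretized actions.

    costs: (T, K) array, costs[t, k] in [0, 1] is the loss of action k at round t.
    eta:   learning rate, e.g. sqrt(8 * log(K) / T).
    Returns the realized regret against the best fixed action in hindsight.
    """
    T, K = costs.shape
    weights = np.ones(K)                      # uniform prior over the discretized actions
    rng = np.random.default_rng(0)
    total_cost = 0.0
    for t in range(T):
        probs = weights / weights.sum()       # play action k with prob. proportional to its weight
        action = rng.choice(K, p=probs)
        total_cost += costs[t, action]
        weights *= np.exp(-eta * costs[t])    # multiplicative update with the full cost vector
    best_fixed = costs.sum(axis=0).min()      # benchmark: best single action in hindsight
    return total_cost - best_fixed

# Illustrative run on arbitrary bounded (hence possibly non-convex) costs.
T, K = 1000, 50
costs = np.random.default_rng(1).random((T, K))
print(exponential_weighting(costs, eta=np.sqrt(8 * np.log(K) / T)))
```

Note that the multiplicative update uses no convexity at all, only boundedness of the losses, which is why exponential weighting is the natural starting point for the non-convex setting.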

Similar Resources

Sequential Change-Point Detection via Online Convex Optimization

Sequential change-point detection when the distribution parameters are unknown is a fundamental problem in statistics and machine learning. When the post-change parameters are unknown, we consider a set of detection procedures based on sequential likelihood ratios with non-anticipating estimators constructed using online convex optimization algorithms such as online mirror descent, which provid...
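For reference, the snippet below sketches online mirror descent with the Euclidean mirror map (i.e., projected online gradient descent), one of the online convex optimization routines named above; the feasible ball, step size, and quadratic losses are illustrative assumptions, not the detection procedure itself.

```python
import numpy as np

def online_mirror_descent(grad_fns, x0, radius, eta):
    """Online mirror descent with the Euclidean mirror map (projected online gradient descent).

    grad_fns: list of callables; grad_fns[t](x) returns the gradient of the round-t loss at x.
    x0:       starting point inside the Euclidean ball of the given radius.
    """
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for grad in grad_fns:
        x = x - eta * grad(x)                  # gradient step on the current loss
        norm = np.linalg.norm(x)
        if norm > radius:                      # project back onto the feasible ball
            x *= radius / norm
        iterates.append(x.copy())
    return iterates

# Illustrative use: quadratic losses f_t(x) = ||x - a_t||^2 with drifting targets a_t.
rng = np.random.default_rng(0)
targets = [rng.normal(size=3) for _ in range(100)]
grads = [lambda x, a=a: 2.0 * (x - a) for a in targets]
xs = online_mirror_descent(grads, x0=np.zeros(3), radius=5.0, eta=0.1)
print(xs[-1])
```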


Online optimization for variable selection in data streams

Variable selection for regression is a classical statistical problem, motivated by concerns that too many covariates invite overfitting. Existing approaches notably include a class of convex optimisation techniques, such as the Lasso algorithm. Such techniques are invariably reliant on assumptions that are unrealistic in streaming contexts, namely that the data is available off-line and the cor...
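To make the streaming contrast concrete, here is a minimal online analogue of the Lasso: proximal online gradient descent with elementwise soft-thresholding after each sample. The data generator, step size, and penalty weight are illustrative assumptions, not the method of the cited paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def streaming_lasso(stream, dim, eta=0.05, lam=0.1):
    """Online proximal gradient for l1-penalized squared loss on a data stream.

    stream: iterable of (x, y) pairs arriving one at a time.
    Returns the final (sparse) coefficient estimate.
    """
    w = np.zeros(dim)
    for x, y in stream:
        grad = (w @ x - y) * x                           # gradient of 0.5*(w.x - y)^2
        w = soft_threshold(w - eta * grad, eta * lam)    # gradient step, then l1 prox
    return w

# Illustrative stream: only the first 3 of 20 covariates are relevant.
rng = np.random.default_rng(0)
w_true = np.zeros(20); w_true[:3] = [2.0, -1.5, 1.0]
stream = ((x, x @ w_true + 0.1 * rng.normal()) for x in rng.normal(size=(2000, 20)))
print(np.round(streaming_lasso(stream, dim=20), 2))
```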


A Linearly Convergent Conditional Gradient Algorithm with Applications to Online and Stochastic Optimization

Linear optimization is many times algorithmically simpler than non-linear convex optimization. Linear optimization over matroid polytopes, matching polytopes and path polytopes are examples of problems for which we have simple and efficient combinatorial algorithms, but whose non-linear convex counterparts are harder and admit significantly less efficient algorithms. This motivates the computation...
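The point about linear oracles can be illustrated with a minimal Frank-Wolfe (conditional gradient) loop over the probability simplex, where linear minimization reduces to picking a single vertex. The quadratic objective and the 2/(t+2) step-size rule are standard illustrative choices, not the linearly convergent variant of the cited paper.

```python
import numpy as np

def frank_wolfe_simplex(grad_fn, dim, iters=200):
    """Conditional gradient (Frank-Wolfe) over the probability simplex.

    Each step calls only a *linear* optimization oracle: minimizing a linear
    function over the simplex means putting all mass on the smallest gradient coordinate.
    """
    x = np.full(dim, 1.0 / dim)                 # feasible start: uniform point
    for t in range(iters):
        g = grad_fn(x)
        s = np.zeros(dim)
        s[np.argmin(g)] = 1.0                   # linear oracle: best vertex of the simplex
        gamma = 2.0 / (t + 2.0)                 # classic step-size schedule
        x = (1 - gamma) * x + gamma * s         # move toward the oracle's vertex
    return x

# Illustrative objective: f(x) = ||x - p||^2 for a target p inside the simplex.
p = np.array([0.5, 0.3, 0.2, 0.0, 0.0])
print(np.round(frank_wolfe_simplex(lambda x: 2.0 * (x - p), dim=5), 3))
```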


Adaptive Weighted Least Squares Algorithm for Volterra Signal Modeling

This paper presents a novel algorithm for least squares (LS) estimation of both stationary and nonstationary signals which arise from Volterra models. The algorithm concerns recursive implementations of the LS method, which usually include a weighting factor in the cost function. This weighting factor enables nonstationary signal models to be tracked. In particular, the behavior of the weigh...
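Since the abstract centers on a weighting factor in a recursive least-squares cost, the sketch below shows standard exponentially weighted recursive least squares; the plain linear regressor (instead of a full Volterra expansion) and the chosen forgetting factor are simplifying assumptions.

```python
import numpy as np

def rls_forgetting(stream, dim, lam=0.98, delta=100.0):
    """Recursive least squares with an exponential forgetting factor lam.

    lam < 1 down-weights old samples, letting the estimate track nonstationary parameters.
    """
    w = np.zeros(dim)
    P = delta * np.eye(dim)                    # large initial covariance (weak prior)
    for x, y in stream:
        Px = P @ x
        k = Px / (lam + x @ Px)                # gain vector
        e = y - w @ x                          # a-priori prediction error
        w = w + k * e                          # update the estimate
        P = (P - np.outer(k, Px)) / lam        # update the inverse-correlation matrix
    return w

# Illustrative nonstationary stream: the true parameters flip sign halfway through.
rng = np.random.default_rng(0)
xs = rng.normal(size=(1000, 4))
w_true = np.array([1.0, -2.0, 0.5, 3.0])
ys = [x @ (w_true if t < 500 else -w_true) + 0.01 * rng.normal() for t, x in enumerate(xs)]
print(np.round(rls_forgetting(zip(xs, ys), dim=4), 2))
```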


Recursive algorithms for inner ellipsoidal approximation of convex polytopes

In this paper, fast recursive algorithms for the approximation of an n-dimensional convex polytope by means of an inscribed ellipsoid are presented. These algorithms consider at each step a single inequality describing the polytope and, under mild assumptions, they are guaranteed to converge in a finite number of steps. Due to their recursive nature, the proposed algorithms are better suited to tre...
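The recursive schemes mentioned above incorporate one polytope inequality at a time; a natural building block for any such scheme is the inclusion test below, which checks whether a candidate ellipsoid still satisfies a set of half-space constraints. The ellipsoid parameterization {c + Lu : ||u|| ≤ 1} and the box example are illustrative assumptions, not the update rule of the cited paper.

```python
import numpy as np

def ellipsoid_in_polytope(c, L, A, b):
    """Check whether the ellipsoid {c + L u : ||u|| <= 1} is inscribed in {x : A x <= b}.

    For each half-space a^T x <= b_i, the maximum of a^T x over the ellipsoid equals
    a^T c + ||L^T a||, so inclusion holds iff that value does not exceed b_i.
    """
    support = A @ c + np.linalg.norm(A @ L, axis=1)   # row-wise a_i^T c + ||L^T a_i||
    return bool(np.all(support <= b + 1e-12))

# Illustrative use: a ball of radius 0.5 centered at the origin inside the unit box.
A = np.vstack([np.eye(2), -np.eye(2)])    # box constraints |x_i| <= 1
b = np.ones(4)
print(ellipsoid_in_polytope(np.zeros(2), 0.5 * np.eye(2), A, b))   # True
print(ellipsoid_in_polytope(np.zeros(2), 1.5 * np.eye(2), A, b))   # False
```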



Journal title:
  • CoRR

Volume: abs/1709.04136   Issue: -

Pages: -

Publication date: 2017